A smoothing stochastic gradient method for composite optimization

Authors
Abstract

Similar Articles

A Smoothing Stochastic Gradient Method for Composite Optimization

We consider the unconstrained optimization problem whose objective function is composed of a smooth and a non-smooth component, where the smooth component is the expectation of a random function. This type of problem arises in some interesting applications in machine learning. We propose a stochastic gradient descent algorithm for this class of optimization problems. When the non-smooth component h...
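A minimal Python sketch of that idea, assuming for illustration that the non-smooth component is an l1 penalty handled by a Huber-type smoothing; grad_f_stoch, lam, mu, and the step size are placeholders rather than the paper's actual scheme:

import numpy as np

def huber_grad(x, mu):
    # Gradient of the Huber smoothing of |.| applied elementwise:
    # x / mu where |x| <= mu, sign(x) otherwise.
    return np.clip(x / mu, -1.0, 1.0)

def smoothing_sgd(grad_f_stoch, x0, lam=0.1, mu=0.01, step=0.01, n_iters=1000):
    # Minimize E[f(x; xi)] + lam * ||x||_1 by replacing the l1 term with its
    # Huber smoothing and running plain SGD on the smoothed objective.
    x = x0.astype(float).copy()
    for _ in range(n_iters):
        g = grad_f_stoch(x) + lam * huber_grad(x, mu)
        x = x - step * g
    return x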

A Sparsity Preserving Stochastic Gradient Method for Composite Optimization

We propose new stochastic gradient algorithms for solving convex composite optimization problems. In each iteration, our algorithms utilize a stochastic oracle of the gradient of the smooth component in the objective function. Our algorithms are based on a stochastic version of the estimate sequence technique introduced by Nesterov (Introductory Lectures on Convex Optimization: A Basic Course, ...
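A rough illustration of such an algorithm, using standard Nesterov-style momentum and l1 soft-thresholding in place of the paper's estimate-sequence construction; grad_f_stoch, lam, and the step size are assumed inputs, not the authors' interface:

import numpy as np

def soft_threshold(v, t):
    # Proximal map of t * ||.||_1; it produces exact zeros, which is what
    # keeps the iterates sparse.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def accel_prox_sgd(grad_f_stoch, x0, lam=0.1, step=0.01, n_iters=1000):
    # Accelerated proximal stochastic gradient for E[f(x; xi)] + lam * ||x||_1,
    # with FISTA-style momentum standing in for the estimate-sequence machinery.
    x = x0.astype(float).copy()
    y = x.copy()
    t = 1.0
    for _ in range(n_iters):
        x_new = soft_threshold(y - step * grad_f_stoch(y), step * lam)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x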

Inexact proximal stochastic gradient method for convex composite optimization

We study an inexact proximal stochastic gradient (IPSG) method for convex composite optimization, whose objective function is a summation of an average of a large number of smooth convex functions and a convex, but possibly nonsmooth, function. Variance reduction techniques are incorporated in the method to reduce the stochastic gradient variance. The main feature of this IPSG algorithm is to a...
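A sketch of a variance-reduced proximal step in the prox-SVRG style; the inexact proximal evaluations that give IPSG its name are not modelled here, and grad_fi, prox_h, and the step size are assumed callables and parameters rather than the paper's interface:

import numpy as np

def prox_svrg(grad_fi, prox_h, x0, n, step=0.01, n_epochs=10, inner=None):
    # Variance-reduced proximal stochastic gradient for (1/n) * sum_i f_i(x) + h(x).
    # grad_fi(x, i) returns grad f_i(x); prox_h(v, t) is the proximal map of t*h.
    # A full gradient at a reference point is reused to reduce the variance of
    # each stochastic gradient; the prox is applied exactly in this sketch.
    rng = np.random.default_rng(0)
    inner = inner if inner is not None else n
    x = x0.astype(float).copy()
    for _ in range(n_epochs):
        x_ref = x.copy()
        full_grad = sum(grad_fi(x_ref, i) for i in range(n)) / n
        for _ in range(inner):
            i = rng.integers(n)
            g = grad_fi(x, i) - grad_fi(x_ref, i) + full_grad
            x = prox_h(x - step * g, step)
    return x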

Accelerated Stochastic Gradient Method for Composite Regularization

Regularized risk minimization often involves nonsmooth optimization. This can be particularly challenging when the regularizer is a sum of simpler regularizers, as in the overlapping group lasso. Very recently, this is alleviated by using the proximal average, in which an implicitly nonsmooth function is employed to approximate the composite regularizer. In this paper, we propose a novel extens...
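A small sketch of the proximal-average idea, with proxes and grad_f_stoch as assumed user-supplied callables; the accelerated extension proposed in the paper is not reproduced:

import numpy as np

def prox_average(proxes, v, t):
    # Proximal-average approximation: when the regularizer is the average of
    # several simple regularizers, approximate its prox by averaging the
    # individual proximal maps (each assumed cheap to evaluate).
    return sum(prox(v, t) for prox in proxes) / len(proxes)

def prox_avg_sgd(grad_f_stoch, proxes, x0, step=0.01, n_iters=1000):
    # A plain stochastic proximal-gradient loop built on the proximal average.
    x = x0.astype(float).copy()
    for _ in range(n_iters):
        x = prox_average(proxes, x - step * grad_f_stoch(x), step)
    return x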

Spectral projected gradient method for stochastic optimization

We consider the Spectral Projected Gradient method for solving constrained optimization problems with the objective function in the form of a mathematical expectation. It is assumed that the feasible set is convex, closed and easy to project on. The objective function is approximated by a sequence of Sample Average Approximation functions with different sample sizes. The sample size update is bas...
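A compact sketch of a spectral projected gradient iteration under these assumptions; grad_saa stands in for the gradient of a Sample Average Approximation whose sample size may grow with the iteration counter, project for the Euclidean projection onto the feasible set, and the paper's sample-size update rule and the usual nonmonotone line search are omitted:

import numpy as np

def spg_saa(grad_saa, project, x0, n_iters=100,
            alpha0=1.0, alpha_min=1e-4, alpha_max=1e4):
    # Spectral Projected Gradient: safeguarded Barzilai-Borwein step length
    # combined with projection onto the convex, easy-to-project feasible set.
    x = x0.astype(float).copy()
    g = grad_saa(x, 0)
    alpha = alpha0
    for k in range(1, n_iters + 1):
        x_new = project(x - alpha * g)
        g_new = grad_saa(x_new, k)
        s, y = x_new - x, g_new - g
        sy = float(s @ y)
        # BB1 spectral step, safeguarded to [alpha_min, alpha_max].
        alpha = min(max(float(s @ s) / sy, alpha_min), alpha_max) if sy > 0 else alpha_max
        x, g = x_new, g_new
    return x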

Journal

Journal title: Optimization Methods and Software

Year: 2014

ISSN: 1055-6788, 1029-4937

DOI: 10.1080/10556788.2014.891592